A Dimension-Independent Discriminant Between Distributions
Authors
Abstract
The Henze-Penrose divergence is a non-parametric divergence measure that can be used to estimate a bound on the Bayes error in a binary classification problem. In this paper, we show that a crossmatch statistic based on optimal weighted matching can be used to directly estimate the Henze-Penrose divergence. Unlike an earlier approach based on the Friedman-Rafsky minimal spanning tree statistic, the proposed method is dimension-independent. The new approach is evaluated using simulation and applied to real datasets to obtain Bayes error estimates.
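To make the procedure concrete, the minimal sketch below pools the two samples, computes a minimum-weight matching on pairwise Euclidean distances, and counts the cross-match pairs (matched pairs with one point from each sample). The function name hp_divergence_crossmatch, the use of networkx.min_weight_matching, and the normalisation of the cross-match count are illustrative assumptions, not the paper's reference implementation; the exact estimator and weighting in the paper may differ.

```python
# Hedged sketch of a cross-match estimator of the Henze-Penrose divergence.
# Assumptions (not from the paper): Euclidean distances, networkx matching,
# and a normalisation chosen so the estimate is near 0 for identical
# distributions and near 1 for well-separated ones.
import numpy as np
import networkx as nx
from scipy.spatial.distance import cdist

def hp_divergence_crossmatch(X, Y):
    """Cross-match estimate of the Henze-Penrose divergence between samples X and Y."""
    m, n = len(X), len(Y)
    Z = np.vstack([X, Y])                  # pooled sample, shape (m + n, d)
    labels = np.array([0] * m + [1] * n)   # which sample each pooled point came from
    dist = cdist(Z, Z)                     # pairwise Euclidean distances

    # Complete graph on the pooled points; minimum-weight matching
    # (perfect when m + n is even).
    N = m + n
    G = nx.Graph()
    for i in range(N):
        for j in range(i + 1, N):
            G.add_edge(i, j, weight=dist[i, j])
    matching = nx.min_weight_matching(G)   # set of matched index pairs

    # Cross-match count: matched pairs with one point from each sample.
    crossmatches = sum(1 for i, j in matching if labels[i] != labels[j])

    # Assumed normalisation: ~0 when the two samples come from the same
    # distribution, ~1 when their supports are well separated.
    return max(0.0, 1.0 - crossmatches * (m + n) / (m * n))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(100, 5))   # class 0 sample
    Y = rng.normal(1.0, 1.0, size=(100, 5))   # class 1 sample, shifted mean
    print("HP divergence estimate:", hp_divergence_crossmatch(X, Y))
```

For equal class priors, such a divergence estimate can then be converted into bounds on the Bayes error rate in the spirit of the Friedman-Rafsky-based bound of Berisha et al., roughly 1/2 - (1/2)*sqrt(D) <= BER <= 1/2 - (1/2)*D; the precise bound used in the paper may differ.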
Similar resources
Defect Classification in NDT Applications using Time Frequency Features, LDA, and a KNN Classifier
In this paper, we develop a new approach for detecting defects in steel. We show that, given the time-varying nature of the acquired signals, we need to use nonstationary signal processing approaches. In particular, we focus on the performance of different time-frequency distributions (TFDs). We show that the extraction of robust features from such TF distributions can lead to excellent classifica...
Discriminant Analysis with High Dimensional von Mises-Fisher Distributions
This paper extends previous work in discriminant analysis with von Mises-Fisher distributions (e.g., Morris and Laycock, Biometrika, 1974) to general dimension, allowing computation of misclassification probabilities. The main result is the probability distribution of the cosine transformation of a von Mises-Fisher distribution, that is, the random variable , where , satisfying , is a random d...
FADA: An Efficient Dimension Reduction Scheme for Image Classification
This paper develops a novel and efficient dimension reduction scheme, Fast Adaptive Discriminant Analysis (FADA). FADA can find a good projection with adaptation to different sample distributions and discover the classification in the subspace with a naïve Bayes classifier. FADA overcomes the high computational cost problem of current Adaptive Discriminant Analysis (ADA) and also alleviates the o...
Non-asymptotic Analysis of Compressive Fisher Discriminants in terms of the Effective Dimension
We provide a non-asymptotic analysis of the generalisation error of compressive Fisher linear discriminant (FLD) classification that is dimension free under mild assumptions. Our analysis includes the effects that random projection has on classification performance under covariance model misspecification, as well as various good and bad effects of random projections that contribute to the overa...
Feature space locality constraint for kernel based nonlinear discriminant analysis
Subspace learning is an important approach in pattern recognition. Nonlinear discriminant analysis (NDA), due to its capability of describing the nonlinear manifold structure of samples, is considered more powerful for classification tasks in image-related problems. In kernel-based NDA representation, there are three spaces involved, i.e., the original data space, the implicitly mapped high ...
Journal: CoRR
Volume: abs/1802.04497
Pages: -
Publication year: 2018